2,909 research outputs found
The perceived quality of process discovery tools
Process discovery has risen in popularity over the last decade among both
researchers and businesses. Recent developments have mainly focused on the
power and functionality of the discovery algorithms. While continuous
improvement of these functional aspects is very important, non-functional
aspects such as visualization and usability are often overlooked. However,
these aspects are valuable for end-users and play an important part in their
experience when working with a process discovery tool. A questionnaire was
sent out to give end-users the opportunity to voice their opinion on available
process discovery tools and on the state of process discovery as a domain in
general. The results of 66 respondents are presented and compared with the
answers of 63 respondents who were contacted through one particular software
vendor's employee and customer base (i.e., Celonis).
The Structured Process Modeling Theory (SPMT): a cognitive view on why and how modelers benefit from structuring the process of process modeling
After observing various inexperienced modelers constructing a business process model based on the same textual case description, it was noted that great differences existed in the quality of the produced models. The impression arose that certain quality issues originated from cognitive failures during the modeling process. Therefore, we developed an explanatory theory that describes the cognitive mechanisms affecting the effectiveness and efficiency of process model construction: the Structured Process Modeling Theory (SPMT). This theory states that modeling accuracy and speed are higher when the modeler adopts an (i) individually fitting, (ii) structured, and (iii) serialized process modeling approach. The SPMT is evaluated against six theory quality criteria.
The Structured Process Modeling Method (SPMM): what is the best way for me to construct a process model?
More and more organizations turn to the construction of process models to support strategic and operational tasks. At the same time, reports indicate quality issues for a considerable part of these models, caused by modeling errors. Therefore, the research described in this paper investigates the development of a practical method to determine and train an optimal process modeling strategy that aims to decrease the number of cognitive errors made during modeling. Such cognitive errors originate in inadequate cognitive processing caused by the inherent complexity of constructing process models. The method helps modelers derive their personal cognitive profile and the related optimal cognitive strategy that minimizes these cognitive failures. The contribution of the research consists of the conceptual method and an automated modeling strategy selection and training instrument. These two artefacts are positively evaluated by a laboratory experiment covering multiple modeling sessions and involving a total of 149 master students at Ghent University.
Scenario driven requirement engineering for design and deployment of mobile communication networks
The number of users and the usage of mobile data services are increasing dramatically due to the introduction of smartphones and mobile broadband dongles. For the next decade the mobile broadband market is expected to grow and reach a level where the average data consumption per user is orders of magnitude greater than today. For the telecom industry it is a formidable challenge to design and deploy these high-capacity wireless networks taking into account limitations in cost, energy and radio spectrum. The objective of this paper is to highlight the need to consider a multitude of scenarios for the requirements, design and deployment of mobile broadband networks. R&D has for many years been targeting high peak data rates enabled by improved spectral efficiency, additional spectrum bands, aggregation of frequency bands and offloading to local wireless networks connected via public fixed phones or broadband. However, many of the features driving this technology development are representative of conditions in the US and Western Europe. The wireless networks also need to be designed for deployment in regions of the world where both the availability of spectrum and the penetration of fixed phones and broadband are limited. Keywords: mobile broadband networks, cost and capacity, spectrum, deployment strategies, telecommunications, management of technology and R&D, economic development of natural resources.
An overview of process model quality literature - The Comprehensive Process Model Quality Framework
The rising interest in the construction and the quality of (business) process models has resulted in an abundance of research studies and divergent findings about process model quality. The lack of overview and the lack of consensus hinder the development of the research field. The research objective is to collect, analyse, structure, and integrate the existing knowledge in a comprehensive framework that strives to balance completeness and relevance without sacrificing overview. The Systematic Literature Review methodology was applied to collect the relevant studies. Because several studies exist that each partially address this research objective, the review was performed at a tertiary level. Based on a critical analysis of the collected papers, a comprehensive but structured overview of the state of the art in the field was composed. The existing academic knowledge about process model quality was carefully integrated and structured into the Comprehensive Process Model Quality Framework (CPMQF). The framework summarizes 39 quality dimensions, 21 quality metrics, 28 quality (sub)drivers, 44 (sub)driver metrics, 64 realization initiatives and 15 concrete process model purposes related to 4 types of organizational benefits, as well as the relations between all of these. This overview is thus considered to form a valuable instrument for both researchers and practitioners concerned with process model quality. The framework is the first to address the concept of process model quality in such a comprehensive way.
The 'physics of diagrams': revealing the scientific basis of graphical representation design
Data is omnipresent in the modern, digital world and a significant number of
people need to make sense of data as part of their everyday social and
professional life. Therefore, together with the rise of data, the design of
graphical representations has gained importance and attention. Yet, although a
large body of procedural knowledge about effective visualization exists, the
quality of representations is often reported to be poor, presumably because
these guidelines are scattered, unstructured and sometimes perceived as
contradictory. Therefore, this paper describes a literature study addressing
these problems. The research resulted in the collection and structuring of 81
guidelines and 34 underlying propositions, as well as in the derivation of 7
foundational principles about graphical representation design, called the
"Physics of Diagrams", which are illustrated with concrete, practical examples
throughout the paper.
Cognitive aspects of structured process modeling
After visualizing data from various observational experiments on the way in which modelers construct process models, a promising process modeling style (i.e., structured process modeling) was discovered that is expected to increase process model quality. A modeler constructs process models in a structured way if she/he works on only a few parts of the model simultaneously. This paper describes cognitive theories that can explain this causal relation. Cognitive Load Theory (CLT) suggests that the number of errors increases when the limited capacity of our working memory is overloaded. Cognitive Fit Theory (CFT) states that performance improves when the representation of the task material matches the task to be executed. Three hypotheses are formulated and the experimental set-up to evaluate these hypotheses is described.